A couple of days ago, a podcast I was listening to mentioned that when people were asked to rate poems, some written by a human and some by an AI, they would, on average, rate the AI’s[1] poems slightly better.
This didn’t bother me per se. But it reminded me of something that has been bothering me for a couple of years now: I once heard that wine tasters, subjected to a similar blind test between discount-store wine and expensive wine, on average couldn’t reliably pick out the expensive wines.
I’m not sure whether one could find flaws in the methodology. Back then I believed it as written. Nowadays, with a little more experience, I can spot a, let’s say, subversive hand at work here. And it was certainly strong enough to shake my faith for years, as you are witnessing.

Takashi Saito preparing sushi 2023 [commons]
But I think it could be true. It could also be that one can find an explanation which restores faith in connoisseurship beyond doubt. Perhaps—maybe even likely—both can be true at the same time. Somehow.
The two cases are a little different. The AI case also bothered me a little less because the people asked to rate were laymen, while in the wine case they were experts. So what if AI art starts getting rated consistently better than human art? I think this is causing quite a bit of panic out there, and the “human element” gets invoked. But that is an appeal to ethics, which is why such warnings from content creators appear helpless: the creators know that nobody cares.[2]
As a “follower” of David Deutsch I think we should just come up with an explanation of what is going on. Whether a thing is created by a human or an AI, we should care about how it was created, why it was created in that way, and why this is good. I came across an article recently by an artist who made the case for how laborious her work is even with AI, and for how skillfully she employs AI.[3] I second her argument, not least because it reflects current common wisdom that AI-assisted crafting means leverage first and foremost—not really different, seen that way, from CAD (computer-aided design).
Also, on the note of “explanation,” in my humble opinion a good work of art tells some truth about the world. In that sense, it has to have some predictive power, as truth means that’s how some aspect of the world is reliably known to play out. This means there is a pattern which is captured by an intelligent person invoking all parts of their embodied being. If AI gets to that point, I’m willing—again with Deutsch—to reconsider and call it a person and grant it an equal say. So, for now, I am putting myself behind the ‘human spark’ theory of why AI-generated art is slop.
And going forward, with steadily growing urgency, I will want to know _how_[4] a piece of art has been made—the creative process involved in making it. I want to peek inside the kitchen of the sushi chef and see the documentary about them, before I give it my verdict of “good.” This is the rebirth of the provenance criterion—for the age of AI.
Here are two pieces I found very interesting: two worthwhile cultural critiques of AI.
This one discusses the “Dark Forest” theory of the Internet:

Although I don’t agree with every bit of it, I thought this was quite good:
Footnotes
Brendan Foody on Teaching AI and the Future of Knowledge Work on Conversations with Tyler.
Or maybe people will polarise around that and then one side will care too much.
Stop the Anti-AI Slop! The Ultimate Answer To Kneejerk Anti-AI Art Hate. Substack.
I once watched a documentary on animation artist Nina Paley and how she used a specific version of a piece of software to create results that, from what I’ve seen, are quite stunning. She was so annoyed when that software was replaced by a successor and she had to change her process. Because everything here is about the process.
Another very similar story that always made me raise my eyebrows was about the artist Burial, whose production process was based on audio editing software like Sound Forge rather than DAWs.